62 research outputs found

    Turing Pattern Dynamics for Spatiotemporal Models with Growth and Curvature

    Turing theory plays an important role in real biological pattern-formation problems, such as solid tumor growth and animal coat patterns. To understand how patterns form and develop over time due to growth, we consider spatiotemporal patterns, in particular Turing patterns, for reaction-diffusion systems on growing surfaces with curvature. Of particular interest is isotropic growth of the sphere, where growth of the domain occurs in the same proportion in all directions. Applying a modified linear stability analysis and a separation-of-timescales argument, we derive the necessary and sufficient conditions for a diffusion-driven instability of the steady state and for the emergence of spatial patterns. Finally, we explore these results using numerical simulations.
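
    For orientation, the generic form of such a system on a uniformly growing sphere of radius R(t) (a sketch following the growing-domain literature; the notation is an assumption, not necessarily that of the paper) reads, in LaTeX:

        \frac{\partial u}{\partial t} = \frac{D_u}{R(t)^2}\,\Delta_{S^2} u - \frac{2\dot{R}(t)}{R(t)}\,u + f(u,v),
        \qquad
        \frac{\partial v}{\partial t} = \frac{D_v}{R(t)^2}\,\Delta_{S^2} v - \frac{2\dot{R}(t)}{R(t)}\,v + g(u,v),

    where \Delta_{S^2} is the Laplace-Beltrami operator on the unit sphere and the -2\dot{R}/R terms account for dilution as the surface area grows. Expanding perturbations of the homogeneous steady state in spherical harmonics (eigenvalues -l(l+1)) reduces the diffusion-driven instability question to the growth or decay of individual modes l, which is where the modified linear stability analysis and the separation-of-timescales argument enter.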

    Turing Patterns on Growing Spheres: The Exponential Case

    We consider Turing patterns for reaction-diffusion systems on the surface of a growing sphere. In particular, we are interested in the effect of dynamic growth on pattern formation. We consider exponential isotropic growth of the sphere, perform a linear stability analysis, and compare the results with numerical simulations.
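
    As a rough illustration of the linear stability calculation in the exponential case, the sketch below computes a frozen-coefficient dispersion relation for spherical-harmonic modes, using Schnakenberg kinetics and parameter values chosen purely for illustration (none of them come from the paper); for exponential isotropic growth, R(t) = R_0 e^{rho t}, the dilution term is the constant -2*rho.

        # Illustrative sketch, not the paper's code: frozen-coefficient dispersion
        # relation for a two-species reaction-diffusion system on a sphere of radius
        # R(t) = R0 * exp(rho * t); exponential isotropic growth contributes a
        # constant dilution term -2*rho.  Kinetics: Schnakenberg (an assumed choice).
        import numpy as np

        # Assumed illustrative parameter values
        a, b = 0.1, 0.9
        Du, Dv = 1.0, 40.0
        R0, rho = 1.0, 0.01

        # Homogeneous steady state of the non-growing kinetics
        # (we linearize about it here, a frozen-domain approximation)
        u0, v0 = a + b, b / (a + b) ** 2

        # Jacobian of f = a - u + u^2 v, g = b - u^2 v at (u0, v0)
        J = np.array([[-1 + 2 * u0 * v0, u0 ** 2],
                      [-2 * u0 * v0,    -u0 ** 2]])

        def leading_growth_rate(ell, t):
            """Real part of the leading eigenvalue of spherical-harmonic mode ell."""
            R = R0 * np.exp(rho * t)
            k2 = ell * (ell + 1) / R ** 2          # Laplace-Beltrami eigenvalue / R^2
            A = J - np.diag([Du * k2, Dv * k2]) - 2 * rho * np.eye(2)
            return np.max(np.linalg.eigvals(A).real)

        for t in (0.0, 50.0, 100.0):
            unstable = [ell for ell in range(1, 30) if leading_growth_rate(ell, t) > 0]
            print(f"t = {t:5.1f}: unstable spherical-harmonic modes l = {unstable}")

    As the sphere grows, modes move in and out of the unstable band, which is the mechanism by which growth reshapes the emerging pattern.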

    Optimal Sensory Coding By Populations Of ON And OFF Neurons

    In many sensory systems, the neural signal is coded by multiple parallel pathways, suggesting an evolutionary fitness benefit of a general nature. A common pathway splitting is that into ON and OFF cells, responding to stimulus increments and decrements, respectively. According to efficient coding theory, sensory neurons have evolved to an optimal configuration for maximizing information transfer given the structure of natural stimuli and circuit constraints. Using the efficient coding framework, we describe two aspects of neural coding: how to optimally split a population into ON and OFF pathways, and how to allocate the firing thresholds of individual neurons given realistic noise levels, stimulus distributions and optimality measures. We find that populations of ON and OFF neurons convey equal information about the stimulus regardless of the ON/OFF mixture, once the thresholds are chosen optimally, independent of stimulus statistics and noise. However, an equal ON/OFF mixture is the most efficient as it uses the fewest spikes to convey this information. The optimal thresholds and coding efficiency, however, depend on noise and stimulus statistics if information is decoded by an optimal linear readout. With non-negligible noise, mixed ON/OFF populations reap significant advantages compared to a homogeneous population. The best coding performance is achieved by a unique mixture of ON/OFF neurons tuned to stimulus asymmetries and noise. We provide a theory for how different cell types work together to encode the full stimulus range using a diversity of response thresholds. The optimal ON/OFF mixtures derived from the theory accord with certain biases observed experimentally.
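
    The "equal information, unequal spike cost" result has a simple noiseless caricature: binary threshold neurons jointly report which inter-threshold interval the stimulus fell in, so any ON/OFF mixture with the same thresholds transmits the same information, while the expected spike count depends on the mixture. The sketch below (my construction with assumed values, not the paper's model) makes this concrete for a Gaussian stimulus.

        # Noiseless binary threshold neurons encoding a Gaussian stimulus.
        # For any ON/OFF mixture the response vector identifies which of the
        # N+1 inter-threshold intervals the stimulus fell in, so the information
        # is the same; the mean spike count depends on the mixture.
        import numpy as np
        from scipy.stats import norm

        N = 6                                   # population size (assumed)
        # Thresholds at equal quantiles of the stimulus distribution
        thresholds = norm.ppf(np.arange(1, N + 1) / (N + 1))
        edges = np.concatenate(([-np.inf], thresholds, [np.inf]))
        p_interval = np.diff(norm.cdf(edges))   # probability of each interval

        # Information = entropy of the interval identity (deterministic code)
        info_bits = -np.sum(p_interval * np.log2(p_interval))

        for n_off in range(N + 1):              # number of OFF cells in the mixture
            # Give the n_off lowest thresholds to OFF cells (one assumed convention)
            p_spike = np.concatenate([norm.cdf(thresholds[:n_off]),       # OFF: s < theta
                                      1 - norm.cdf(thresholds[n_off:])])  # ON:  s > theta
            print(f"{n_off} OFF / {N - n_off} ON: I = {info_bits:.2f} bits, "
                  f"mean spikes = {p_spike.sum():.2f}")

    The mean spike count is minimized by the half-ON/half-OFF mixture, consistent with the efficiency argument above; with noise or a linear readout, as the abstract notes, the optimal mixture is no longer guaranteed to be this simple.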

    Computational implications of biophysical diversity and multiple timescales in neurons and synapses for circuit performance

    Despite advances in experimental and theoretical neuroscience, we are still trying to identify key biophysical details that are important for characterizing the operation of brain circuits. Biological mechanisms at the level of single neurons and synapses can be combined as ‘building blocks’ to generate circuit function. We focus on the importance of capturing multiple timescales when describing these intrinsic and synaptic components. Whether inherent in the ionic currents, the neuron’s complex morphology, or the neurotransmitter composition of synapses, these multiple timescales prove crucial for capturing the variability and richness of circuit output and enhancing the information-carrying capacity observed across nervous systems.

    Benefits of Pathway Splitting in Sensory Coding

    In many sensory systems, the neural signal splits into multiple parallel pathways. For example, in the mammalian retina, ∼20 types of retinal ganglion cells transmit information about the visual scene to the brain. The purpose of this profuse and early pathway splitting remains unknown. We examine a common instance of splitting into ON and OFF neurons excited by increments and decrements of light intensity in the visual scene, respectively. We test the hypothesis that pathway splitting enables more efficient encoding of sensory stimuli. Specifically, we compare a model system with an ON and an OFF neuron to one with two ON neurons. Surprisingly, the optimal ON–OFF system transmits the same information as the optimal ON–ON system, if one constrains the maximal firing rate of the neurons. However, the ON–OFF system uses fewer spikes on average to transmit this information. This superiority of the ON–OFF system is also observed when the two systems are optimized while constraining their mean firing rate. The efficiency gain for the ON–OFF split is comparable with that derived from decorrelation, a well known processing strategy of early sensory systems. The gain can be orders of magnitude larger when the ecologically important stimuli are rare but large events of either polarity. The ON–OFF system also provides a better code for extracting information by a linear downstream decoder. The results suggest that the evolution of ON–OFF diversification in sensory systems may be driven by the benefits of lowering average metabolic cost, especially in a world in which the relevant stimuli are sparse.
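
    The "rare but large events of either polarity" case has an easy back-of-the-envelope version: an ON-OFF pair can place both thresholds in the tails and stay silent between events, whereas an ON-ON pair must place one threshold below the negative-event level and therefore fires on nearly every sample. The sketch below (an assumed toy stimulus, not the paper's model) shows the resulting spike-cost gap.

        # Two noiseless binary neurons reporting rare, large events of either
        # polarity.  Both pairs convey the same event information (+, -, or none);
        # the ON-ON pair pays for it with near-continuous firing.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 1_000_000
        event_prob, event_size = 1e-3, 5.0      # rare large events (assumed values)

        s = rng.normal(0.0, 0.1, size=n)        # small background fluctuations
        events = rng.random(n) < event_prob
        s[events] += event_size * rng.choice([-1.0, 1.0], size=events.sum())

        theta = event_size / 2                  # detects events of either polarity

        # ON-OFF pair: ON fires for s > +theta, OFF fires for s < -theta
        spikes_on_off = np.mean(s > theta) + np.mean(s < -theta)
        # ON-ON pair: one cell must threshold below -theta to catch negative
        # events, so it fires for almost every sample
        spikes_on_on = np.mean(s > theta) + np.mean(s > -theta)

        print(f"mean spikes/sample  ON-OFF: {spikes_on_off:.4f}   ON-ON: {spikes_on_on:.4f}")
        print(f"spike-cost ratio ON-ON / ON-OFF: {spikes_on_on / spikes_on_off:.0f}x")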

    Implications of single-neuron gain scaling for information transmission in networks

    Summary: 

Many neural systems are equipped with mechanisms to efficiently encode sensory information. To represent natural stimuli with time-varying statistical properties, neural systems should adjust their gain to the inputs' statistical distribution. Such matching of dynamic range to input statistics has been shown to maximize the information transmitted by the output spike trains (Brenner et al., 2000, Fairhall et al., 2001). Gain scaling has not only been observed as a system response property, but also in single neurons in developing somatosensory cortex stimulated with currents of different amplitude (Mease et al., 2010). While gain scaling holds for cortical neurons at the end of the first post-natal week, at birth these neurons lack this property. The observed improvement in gain scaling coincides with the disappearance of spontaneous waves of activity in cortex (Conheim et al., 2010).

We studied how single-neuron gain scaling affects the dynamics of signal transmission in networks, using the developing cortex as a model. In a one-layer feedforward network, we showed that the absence of gain control made the network relatively insensitive to uncorrelated local input fluctuations. As a result, these neurons selectively and synchronously responded to large, slowly varying correlated input: the slow build-up of synaptic noise generated in pacemaker circuits, which most likely triggers waves. Neurons in gain-scaling networks were more sensitive to the small-scale input fluctuations and responded asynchronously to the slow envelope. Thus, gain scaling both increases information in individual neurons about private inputs and allows the population average to encode the slow fluctuations in the input. Paradoxically, the synchronous firing that corresponds to wave propagation is associated with low information transfer. We therefore suggest that the emergence of gain scaling may help the system to increase information transmission on multiple timescales as sensory stimuli become important later in development.

Methods:

Networks with one and two layers, consisting of hundreds of model neurons, were constructed. The ability of single neurons to gain scale was controlled by changing the ratio of sodium to potassium conductances in Hodgkin-Huxley neurons (Mainen et al., 1995). The response of single-layer networks was studied with ramp-like stimuli whose slopes varied over several hundreds of milliseconds. Fast fluctuations were superimposed on this slowly varying mean. The response of these networks was then tested with continuous stimuli. Gain-scaling networks captured the slow fluctuations in the inputs, while non-scaling networks simply thresholded the input. Quantifying information transmission confirmed that gain-scaling neurons transmit more information about the stimulus. With the two-layer networks we simulated a cortical network where waves could spontaneously emerge, propagate and degrade, based on the gain-scaling properties of the neurons in the network.
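
A minimal sketch of the kind of single-neuron simulation this describes is given below. It uses standard squid-axon Hodgkin-Huxley kinetics rather than the Mainen et al. (1995) formulation cited above, and the conductance ratios, ramp amplitude and noise level are assumptions chosen only to illustrate the protocol of a slow ramp with fast fluctuations superimposed.

    # Single-compartment Hodgkin-Huxley neuron (standard squid-axon kinetics)
    # whose sodium-to-potassium conductance ratio is a free parameter, driven
    # by a slow current ramp with fast Gaussian fluctuations superimposed.
    import numpy as np

    def rates(V):
        am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
        bm = 4.0 * np.exp(-(V + 65) / 18)
        ah = 0.07 * np.exp(-(V + 65) / 20)
        bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
        an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
        bn = 0.125 * np.exp(-(V + 65) / 80)
        return am, bm, ah, bh, an, bn

    def simulate(g_ratio, I_ext, dt=0.01):
        """Forward-Euler HH simulation; g_ratio sets gNa relative to gK."""
        gK, gL, C = 36.0, 0.3, 1.0
        gNa = g_ratio * gK
        ENa, EK, EL = 50.0, -77.0, -54.4
        V, m, h, n = -65.0, 0.05, 0.6, 0.32
        spike_times, above = [], False
        for i, I in enumerate(I_ext):
            am, bm, ah, bh, an, bn = rates(V)
            m += dt * (am * (1 - m) - bm * m)
            h += dt * (ah * (1 - h) - bh * h)
            n += dt * (an * (1 - n) - bn * n)
            I_ion = (gNa * m ** 3 * h * (V - ENa) + gK * n ** 4 * (V - EK)
                     + gL * (V - EL))
            V += dt * (I - I_ion) / C
            if V > 0 and not above:               # upward zero crossing = spike
                spike_times.append(i * dt)
            above = V > 0
        return np.array(spike_times)

    # Slow ramp over several hundred milliseconds with fast fluctuations on top
    dt, T = 0.01, 600.0                           # ms
    t = np.arange(0, T, dt)
    rng = np.random.default_rng(1)
    I_ext = 10.0 * t / T + 3.0 * rng.standard_normal(t.size)   # uA/cm^2

    for g_ratio in (2.0, 10.0 / 3.0):             # assumed "low" and "standard" ratios
        spikes = simulate(g_ratio, I_ext, dt)
        print(f"gNa/gK = {g_ratio:.2f}: {spikes.size} spikes")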

    Homeostatic Activity-Dependent Tuning of Recurrent Networks for Robust Propagation of Activity.

    Developing neuronal networks display spontaneous bursts of action potentials that are necessary for circuit organization and tuning. While spontaneous activity has been shown to instruct map formation in sensory circuits, it is unknown whether it plays a role in the organization of motor networks that produce rhythmic output. Using computational modeling, we investigate how recurrent networks of excitatory and inhibitory neuronal populations assemble to produce robust patterns of unidirectional and precisely timed propagating activity during organism locomotion. One example is provided by the motor network in Drosophila larvae, which generates propagating peristaltic waves of muscle contractions during crawling. We examine two activity-dependent models, which tune weak network connectivity based on spontaneous activity patterns: a Hebbian model, where coincident activity in neighboring populations strengthens connections between them; and a homeostatic model, where connections are homeostatically regulated to maintain a constant level of excitatory activity based on spontaneous input. The homeostatic model successfully tunes network connectivity to generate robust activity patterns with appropriate timing relationships between neighboring populations. These timing relationships can be modulated by the properties of spontaneous activity, suggesting its instructive role for generating functional variability in network output. In contrast, the Hebbian model fails to produce the tight timing relationships between neighboring populations required for unidirectional activity propagation, even when additional assumptions are imposed to constrain synaptic growth. These results argue that homeostatic mechanisms are more likely than Hebbian mechanisms to tune weak connectivity based on spontaneous input in a recurrent network for rhythm generation and robust activity propagation. SIGNIFICANCE STATEMENT: How are neural circuits organized and tuned to maintain stable function and produce robust output? This task is especially difficult during development, when circuit properties change in response to variable environments and internal states. Many developing circuits exhibit spontaneous activity, but its role in the synaptic organization of motor networks that produce rhythmic output is unknown. We studied a model motor network that, when appropriately tuned, generates propagating activity as during crawling in Drosophila larvae. Based on experimental evidence of activity-dependent tuning of connectivity, we examined plausible mechanisms by which appropriate connectivity emerges. Our results suggest that activity-dependent homeostatic mechanisms are better suited than Hebbian mechanisms for organizing motor network connectivity, and highlight an important difference from sensory areas.

    This work was supported by Cambridge Overseas Research Fund, Trinity College, and Swartz Foundation to J.G. and Wellcome Trust VIP funding to J.F.E. through Program Grant WT075934 to Michael Bate and Matthias Landgraf. J.G. is also supported by Burroughs-Wellcome Fund Career Award at the Scientific Interface. This is the final version of the article. It first appeared from the Society for Neuroscience via https://doi.org/10.1523/JNEUROSCI.2511-15.201
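
    A drastically simplified sketch of the homeostatic idea (a feedforward chain of rate units standing in for the paper's recurrent excitatory/inhibitory populations; all parameter values are assumptions) is given below: during repeated "spontaneous" episodes, each unit's incoming weight is nudged so that its peak activity approaches a target level, after which a pulse injected at one end propagates along the chain without decaying or blowing up.

        # Homeostatic tuning of weak connectivity in a chain of excitatory rate
        # units.  Each episode, a brief pulse is injected at unit 0 and every
        # downstream weight is adjusted toward a target peak activity.
        import numpy as np

        N, T, dt, tau = 8, 200, 0.1, 1.0
        target, eta = 0.5, 0.2
        f = lambda u: np.tanh(np.maximum(u, 0.0))      # saturating rate nonlinearity

        def run_episode(w, pulse=1.0):
            """One pulse at unit 0; return the peak activity of every unit."""
            x = np.zeros(N)
            peaks = np.zeros(N)
            for step in range(T):
                drive = np.zeros(N)
                drive[0] = pulse if step < 20 else 0.0  # brief input pulse
                inp = np.concatenate(([0.0], w[1:] * x[:-1])) + drive
                x += dt / tau * (-x + f(inp))
                peaks = np.maximum(peaks, x)
            return peaks

        w = np.full(N, 0.3)                             # weak initial connectivity
        for episode in range(300):                      # homeostatic tuning episodes
            peaks = run_episode(w)
            w[1:] += eta * (target - peaks[1:])         # push each peak toward target
            w = np.clip(w, 0.0, None)

        print("tuned weights:", np.round(w[1:], 2))
        print("peak activity per unit:", np.round(run_episode(w)[1:], 2))

    A Hebbian variant would instead grow w[i] whenever units i-1 and i are coactive; without an extra constraint such a rule has no set point, which is one intuition for why homeostatic regulation may be better suited to this tuning problem.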

    Intrinsic Neuronal Properties Switch the Mode of Information Transmission in Networks

    Diverse ion channels and their dynamics endow single neurons with complex biophysical properties. These properties determine the heterogeneity of cell types that make up the brain, as constituents of neural circuits tuned to perform highly specific computations. How do biophysical properties of single neurons impact network function? We study a set of biophysical properties that emerge in cortical neurons during the first week of development, eventually allowing these neurons to adaptively scale the gain of their response to the amplitude of the fluctuations they encounter. During the same time period, these same neurons participate in large-scale waves of spontaneously generated electrical activity. We investigate the potential role of experimentally observed changes in intrinsic neuronal properties in determining the ability of cortical networks to propagate waves of activity. We show that such changes can strongly affect the ability of multi-layered feedforward networks to represent and transmit information on multiple timescales. With properties modeled on those observed at early stages of development, neurons are relatively insensitive to rapid fluctuations and tend to fire synchronously in response to wave-like events of large amplitude. Following developmental changes in voltage-dependent conductances, these same neurons become efficient encoders of fast input fluctuations over a few layers, but lose the ability to transmit slower, population-wide input variations across many layers. Depending on the neurons' intrinsic properties, noise plays different roles in modulating neuronal input-output curves, which can dramatically impact network transmission. The developmental change in intrinsic properties supports a transformation of a network's function from the propagation of network-wide information to one in which computations are scaled to local activity. This work underscores the significance of simple changes in conductance parameters in governing how neurons represent and propagate information, and suggests a role for background synaptic noise in switching the mode of information transmission.
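
    A toy, rate-based caricature of the two transmission modes (my construction, not the paper's Hodgkin-Huxley-based network; all values are assumptions) is sketched below: a stimulus whose fast fluctuations are modulated by a slow envelope is passed through several layers that apply either a fixed high threshold or a gain-scaled nonlinearity that divides by a running estimate of the local standard deviation.

        # Multi-layer feedforward transformation of envelope-modulated noise.
        # Fixed-threshold layers mainly transmit the slow envelope; gain-scaled
        # layers preserve fast structure but normalize the envelope away.
        import numpy as np

        rng = np.random.default_rng(2)
        T, L, win = 20_000, 5, 200                    # samples, layers, smoothing window

        t = np.arange(T)
        envelope = 1.0 + 0.8 * np.sin(2 * np.pi * t / 5000)   # slow amplitude modulation
        fast = rng.standard_normal(T)                         # fast fluctuations
        x = envelope * fast

        def smooth(sig, w=win):
            return np.convolve(sig, np.ones(w) / w, mode="same")

        def layer_threshold(sig, theta=1.5, weight=10.0):
            # non-scaling unit: fixed threshold; a fixed feedforward weight
            # (assumed) rescales the smoothed event rate for the next layer
            return weight * smooth((sig > theta).astype(float))

        def layer_gain_scaled(sig):
            # gain-scaling unit: divide by a running estimate of the local std
            return np.tanh(sig / (np.sqrt(smooth(sig ** 2)) + 1e-6))

        def corr(a, b):
            return np.corrcoef(a, b)[0, 1]

        y_thr, y_gs = x.copy(), x.copy()
        for _ in range(L):
            y_thr = layer_threshold(y_thr)
            y_gs = layer_gain_scaled(y_gs)

        print(f"threshold net  : corr with envelope {corr(y_thr, envelope):.2f}, "
              f"with fast input {corr(y_thr, fast):.2f}")
        print(f"gain-scaled net: corr with envelope {corr(y_gs, envelope):.2f}, "
              f"with fast input {corr(y_gs, fast):.2f}")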

    The Role of Vaccination in the Control of SARS

    We assess pre-outbreak and during-outbreak vaccination as control strategies for SARS epidemics using a mathematical model that includes susceptible, latent (traced and untraced), infectious, isolated and recovered individuals. Scenarios focusing on policies that include contact tracing and levels of self-isolation among untraced infected individuals are explored. Bounds on the proportion of individuals successfully vaccinated before the outbreak are provided using the basic reproductive number. Uncertainty and sensitivity analyses on the reproductive number are carried out. The final epidemic size under different vaccination scenarios is computed.
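
    A schematic compartmental sketch of this kind of model (simplified from the abstract; the parameter values are assumptions, not the paper's calibrated estimates) is given below. A fraction p of the population is vaccinated before the outbreak, a fraction q of new infections is traced into an isolated latent class, and the classic threshold p_c = 1 - 1/R0 gives the critical pre-outbreak coverage.

        # SEIJR-style model: susceptible, untraced latent, traced latent,
        # infectious, isolated, recovered, with pre-outbreak vaccination.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta, sigma, gamma = 0.25, 1 / 5.0, 1 / 10.0   # transmission, latency, recovery (assumed)
        q, delta = 0.3, 1 / 3.0                        # traced fraction, isolation rate (assumed)
        R0 = beta / gamma * (1 - q)                    # traced latents never transmit here
        p_c = max(0.0, 1 - 1 / R0)

        def rhs(t, y):
            S, E_u, E_t, I, J, R = y
            new_inf = beta * S * I                     # population normalized to 1
            return [-new_inf,
                    (1 - q) * new_inf - sigma * E_u,   # untraced latents -> infectious
                    q * new_inf - delta * E_t,         # traced latents -> isolation
                    sigma * E_u - gamma * I,
                    delta * E_t - gamma * J,           # isolated individuals recover
                    gamma * (I + J)]

        def final_size(p_vacc):
            # vaccinated individuals start in the recovered/immune class
            y0 = [(1 - p_vacc) - 1e-4, 1e-4, 0.0, 0.0, 0.0, p_vacc]
            sol = solve_ivp(rhs, (0, 2000), y0, rtol=1e-8, atol=1e-10)
            return sol.y[5, -1] - p_vacc               # infections, excluding vaccinees

        print(f"R0 = {R0:.2f}, critical coverage p_c = {p_c:.2f}")
        for p in (0.0, 0.2, p_c, 0.6):
            print(f"vaccination coverage {p:.2f}: final epidemic size {final_size(p):.3f}")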